Constructing Nonlinear Discriminants from Multiple Data Views

Authors

  • Tom Diethe
  • David R. Hardoon
  • John Shawe-Taylor
Abstract

There are many situations in which we have more than one view of a single data source, or in which we have multiple sources of data that are aligned. We would like to be able to build classifiers which incorporate these to enhance classification performance. Kernel Fisher Discriminant Analysis (KFDA) can be formulated as a convex optimisation problem, which we extend to the Multiview setting (MFDA) and introduce a sparse version (SMFDA). We show that our formulations are justified from both probabilistic and learning theory perspectives. We then extend the optimisation problem to account for directions unique to each view (PMFDA). We show experimental validation on a toy dataset, and then give experimental results on a brain imaging dataset and part of the PASCAL 2007 VOC challenge dataset.
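The starting point of the abstract, Kernel Fisher Discriminant Analysis, can be sketched as follows. This is a generic single-view KFDA in dual form (not the authors' convex multiview formulation), where the RBF kernel, the ridge term `mu`, and the midpoint threshold are illustrative choices of this sketch, not details taken from the paper:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of X and Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def kfda(X, y, gamma=1.0, mu=1e-3):
    """Dual KFDA: alpha is proportional to (N + mu*I)^{-1} (m_pos - m_neg),
    where m_c are class-mean kernel columns and N is the within-class
    scatter expressed in the kernel-induced feature space."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    m_pos = K[:, y == 1].mean(axis=1)
    m_neg = K[:, y == -1].mean(axis=1)
    N = np.zeros((n, n))
    for c in (1, -1):
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        # Centering each class's kernel columns gives its scatter contribution.
        N += Kc @ (np.eye(nc) - np.ones((nc, nc)) / nc) @ Kc.T
    alpha = np.linalg.solve(N + mu * np.eye(n), m_pos - m_neg)
    b = -0.5 * alpha @ (m_pos + m_neg)  # threshold midway between projected class means
    return alpha, b

def predict(alpha, b, X_train, X_test, gamma=1.0):
    # Project test points onto the discriminant direction and threshold.
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha + b)
```

The multiview extensions (MFDA, SMFDA, PMFDA) replace this generalized-eigenvalue-style solution with a convex optimisation over the views jointly, which is what the paper develops.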


Similar Articles

Classification and Reductio-ad-Absurdum

Proofs for the optimality of classification in real-world machine learning situations are constructed. The validity of each proof requires reasoning about the probability of certain subsets of feature vectors. It is shown that linear discriminants classify by making the least demanding assumptions on the values of these probabilities. This enables measuring the confidence of classification by l...

Constructing Pairing-friendly Curves with Variable CM Discriminant

A new algorithm is proposed for the construction of Brezing-Weng-like elliptic curves such that the polynomials defining the CM discriminant are linear. Using this construction, new families of curves with variable discriminants and embedding degrees of k ∈ {8, 16, 20, 24}, which were not covered by Freeman, Scott, and Teske [9], are presented. Our result is useful for constructing elliptic curves w...

Boosted Dyadic Kernel Discriminants

We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). This algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer’s conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of Ad...

Multi-view low-rank sparse subspace clustering

Most existing approaches address the multi-view subspace clustering problem by constructing the affinity matrix on each view separately and then extending a spectral clustering algorithm to handle multi-view data. This paper presents an approach to multi-view subspace clustering that learns a joint subspace representation by constructing an affinity matrix shared among all views. Relyi...

Towards Self-Exploring Discriminating Features

Many visual learning tasks are confronted by some common difficulties. One of them is the lack of supervised information, since labeling can be tedious, expensive, or even impossible. Such a scenario makes it challenging to learn object concepts from images. This problem can be alleviated by using a hybrid of labeled and unlabeled training data for learning. Since the unl...


Journal:

Volume   Issue

Pages  -

Publication date: 2010